50 research outputs found

    Small nets and short paths optimising neural computation

    Population-based continuous optimization, probabilistic modelling and mean shift

    Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to, and compared with, previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
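
    The KL-gradient view of population-based search can be illustrated with a minimal sketch: sample a population from a Gaussian model, weight the samples by the objective treated as an unnormalised density, and move the model mean toward the weighted mean. This is an illustrative reconstruction, not the paper's exact update rule; the objective `peak`, the step size, and all parameter values are assumptions.

    ```python
    import math
    import random

    def kl_population_step(mu, sigma, fitness, n=200, lr=0.5):
        # Sample a population from the current Gaussian search model.
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        # Treat the non-negative objective as an unnormalised target density
        # and weight each sample by it (importance-style weighting).
        ws = [fitness(x) for x in xs]
        total = sum(ws) or 1.0
        # The weighted sample mean estimates the descent direction on the
        # KL divergence between the model and the target density.
        target_mu = sum(w * x for w, x in zip(ws, xs)) / total
        return mu + lr * (target_mu - mu)

    random.seed(0)
    mu, sigma = -5.0, 2.0
    peak = lambda x: math.exp(-(x - 3.0) ** 2)   # assumed toy objective, peak at x = 3
    for _ in range(100):
        mu = kl_population_step(mu, sigma, peak)
    print(f"model mean after 100 steps: {mu:.2f}")
    ```

    With a fixed model width, the mean drifts toward the objective's mode, which is the qualitative behaviour the paper analyses formally.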

    Exploring the potential of neural networks to predict statistics of solar wind turbulence

    Time series datasets often have missing or corrupted entries, which need to be ignored in subsequent data analysis. For example, in the context of space physics, calibration issues, satellite telemetry issues, and unexpected events can make parts of a time series unusable. Various approaches exist to tackle this problem, including mean/median imputation, linear interpolation, and autoregressive modelling. Here we study the utility of artificial neural networks (ANNs) for predicting statistics, particularly second-order structure functions, of turbulent solar wind time series. Using a dataset with artificial gaps, a neural network is trained to predict second-order structure functions and then tested on an unseen dataset to quantify its performance. A small feedforward ANN, with only 20 hidden neurons, can predict the large-scale fluctuation amplitudes better than mean imputation or linear interpolation when the percentage of missing data is high. Although the ANNs perform worse than the other methods at capturing both the shape and the fluctuation amplitude together, their performance is better in a statistical sense for large fractions of missing data. Caveats regarding their utility, the optimisation procedure, and potential future improvements are discussed.
    Comment: 17 pages, 5 figures, 2 tables
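
    The second-order structure function that the network above is trained to predict has a simple empirical estimator: the mean squared increment at a given lag, with gapped samples skipped. The sketch below is a plain baseline estimator (not the paper's ANN), and the toy sinusoid is an assumed test signal.

    ```python
    import math

    def structure_function_2(series, lag):
        """Second-order structure function S2(tau) = <(x(t + tau) - x(t))^2>,
        skipping pairs where either sample is missing (None)."""
        diffs = [
            (b - a) ** 2
            for a, b in zip(series, series[lag:])
            if a is not None and b is not None
        ]
        return sum(diffs) / len(diffs) if diffs else float("nan")

    # Toy signal: a pure sinusoid, for which S2(tau) = 2 <x^2> (1 - rho(tau)).
    x = [math.sin(0.1 * t) for t in range(1000)]
    s_full = structure_function_2(x, 5)

    # Knock out every 7th sample to mimic a gapped time series; the
    # pair-skipping estimator should give a similar value.
    x_gappy = [None if t % 7 == 0 else v for t, v in enumerate(x)]
    s_gappy = structure_function_2(x_gappy, 5)
    ```

    For heavily gapped data the number of valid pairs shrinks and this direct estimator becomes noisy, which is the regime where the paper finds a learned predictor helpful.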

    Adaptation and enslavement in endosymbiont-host associations

    The evolutionary persistence of symbiotic associations is a puzzle. Adaptation should eliminate cooperative traits if it is possible to enjoy the advantages of cooperation without reciprocating - a facet of cooperation known in game theory as the Prisoner's Dilemma. Despite this barrier, symbioses are widespread, and may have been necessary for the evolution of complex life. The discovery of strategies such as tit-for-tat has been presented as a general solution to the problem of cooperation. However, this only holds for within-species cooperation, where a single strategy will come to dominate the population. In a symbiotic association each species may have a different strategy, and the theoretical analysis of the single-species problem is no guide to the outcome. We present a basic analysis of two-species cooperation and show that a species with a fast adaptation rate is enslaved by a slowly evolving one. Paradoxically, the rapidly evolving species becomes highly cooperative, whereas the slowly evolving one gives little in return. This helps to explain the occurrence of endosymbioses where the host benefits, but the symbionts appear to gain little from the association.
    Comment: v2: Correction made to equations 5 & 6; v3: Revised version accepted in Phys. Rev. E; new figure added

    Optimization and Probabilistic Modelling

    Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. Existing continuous evolutionary algorithms can be viewed from a probabilistic modelling perspective as approximating this gradient descent method. Experimental results demonstrate the method on a search problem with substantial additive noise.

    Multiple output Gaussian process regression

    Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
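
    The convolution construction has a convenient closed form when the smoothing kernels are themselves Gaussian: the cross-covariance between two outputs is again Gaussian, with the squared lengthscales of the two kernels adding. The sketch below assumes unit-amplitude Gaussian kernels in one dimension, and the lengthscales in `ells` are illustrative values, not from the paper.

    ```python
    import math

    def cross_covariance(ell_i, ell_j, x, xp):
        """Covariance between output i at x and output j at xp, where each
        output is white noise convolved with a Gaussian smoothing kernel of
        width ell. The convolution of the two kernels is again Gaussian,
        with the squared lengthscales adding."""
        s2 = ell_i ** 2 + ell_j ** 2
        norm = math.sqrt(2.0 * math.pi * (ell_i ** 2) * (ell_j ** 2) / s2)
        return norm * math.exp(-((x - xp) ** 2) / (2.0 * s2))

    # Joint covariance over two coupled outputs observed at the same inputs.
    # Because both outputs share the same white noise source, the off-diagonal
    # blocks couple them, yet the full matrix is valid (symmetric PSD) by
    # construction - the difficulty the abstract says direct covariance
    # parameterisation runs into.
    xs = [0.0, 1.0, 2.0]
    ells = [0.5, 1.5]  # one smoothing-kernel width per output (assumed values)
    K = [
        [cross_covariance(ells[a], ells[b], xa, xb)
         for b in (0, 1) for xb in xs]
        for a in (0, 1) for xa in xs
    ]
    ```

    Parameterising the kernels rather than the covariance means positive definiteness never has to be checked: any choice of kernel widths yields a valid joint model.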